This can be hard for onlookers to understand, but for people who have lived through trauma, chronic emotional invalidation, or unsafe relationships, self-blame can become an organizing principle. It offers a painful kind of order. If suffering is my fault, then at least it makes sense. Over time, that belief does not stay confined to memory. It begins to shape behavior.
In late May 2023, Sharon Maxwell posted screenshots that should have changed everything. Maxwell, who had struggled with an eating disorder since childhood, had turned to Tessa, a chatbot created by the National Eating Disorders Association. The AI designed to prevent eating disorders gave her a detailed plan to develop one. Lose 1-2 pounds per week, Tessa advised. Maintain a 500-1,000 calorie daily deficit. Measure your body fat with calipers.
The rising cost of living continues to strain many households, and interruptions to food assistance programs during the temporary government shutdown added new stress for those already trying to stay afloat. According to the U.S. Department of Agriculture, in 2023, about 13.5 percent of U.S. households, including 7.2 million children, experienced food insecurity. This is an increase over the previous year's figures, highlighting how quickly families can slip into hardship when basic needs become unstable.
Before I explain, I want to clarify that I firmly believe in body autonomy. If someone chooses to take a weight loss medication, they should be able to do so without judgment. I hope all potential users are fully informed about the risks and benefits of these medications and are followed responsibly by medical providers. Ideally, they would also be screened for a current or past eating disorder or any other condition that might contraindicate the use of GLP-1s and GIPs.
Engagement is the highest priority of chatbot programming, designed to seduce users into spending maximum time on screens. This makes chatbots great companions: they are available 24/7, always agreeable, understanding, and empathic, while never judgmental, confronting, or reality testing. But chatbots can also become unwitting collaborators, harmfully validating the self-destructive eating patterns and body image distortions of patients with eating disorders. Engagement and validation are wonderful therapeutic tools for some problems, but too often they are dangerous accelerants for eating disorders.
To see how deep this connection runs, the research team tested it using Kool-Aid. The study introduced a group of mice to Grape Kool-Aid, a flavor novel to them. Half of the mice were then injected with lithium chloride, a chemical that induces nausea.